CHING-YI TSAI

Animated Daily Object
Animated Everyday Objects with 6-DoF Social Body Gestures
CLASS PROJECT | 18 JUN 2021

What if everyday objects could communicate through social body gestures?

Intro

Body gestures are key in human communication, yet as everyday objects become more integrated with ubiquitous computing and IoT, their interactions are often confined to voice or screen displays, like in smart speakers. This project investigates the use of body gestures in these everyday objects. We began with a survey study across 12 everyday scenarios to understand people’s expectations of objects’ gestural communications. Based on this feedback and previous research, we designed body gestures incorporating six degrees of freedom (6-DoF) movements. We then conducted a study with 14 participants to see if these gestures effectively convey various social intentions in daily events. The study revealed that these body gestures were successful in expressing diverse social intents, and the 6-DoF movements proved to be a non-intrusive way to enhance everyday objects without the need for complete redesigns.

Designing Everyday Objects’ Body Gestures

To design a set of applicable, sufficiently general body gestures for everyday objects, we presented 12 different daily scenarios to users and asked them to design the social intention and the corresponding gesture an object should perform, from two different perspectives: the object’s perspective (animalism / anthropomorphism) and the user’s perspective (functionality).

Users designing objects' body gestures with their own everyday objects.

Gesture Space

Combining user feedback and prior work, we designed the following 6-DoF gestures to express various social intentions.

Body gestures in 6-DoF.
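To make the gesture space concrete, a 6-DoF gesture can be represented as a sequence of keyframe poses, each combining three translational and three rotational components. The sketch below is purely illustrative (the `Pose6DoF` and `nod` names are hypothetical, not from the project), showing how an "acknowledge"-style nod might be encoded as repeated pitch tilts:

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """One keyframe: translation offsets and rotation angles (degrees)."""
    x: float = 0.0      # translation: left/right
    y: float = 0.0      # translation: up/down
    z: float = 0.0      # translation: forward/back
    roll: float = 0.0   # rotation about the forward axis
    pitch: float = 0.0  # rotation about the lateral axis
    yaw: float = 0.0    # rotation about the vertical axis

def nod(amplitude_deg: float = 20.0, repeats: int = 2) -> list[Pose6DoF]:
    """Hypothetical 'acknowledge' gesture: tilt forward and back repeatedly."""
    frames = []
    for _ in range(repeats):
        frames.append(Pose6DoF(pitch=amplitude_deg))  # tilt forward
        frames.append(Pose6DoF(pitch=0.0))            # return to neutral
    return frames
```

An actuated object would interpolate between these keyframes; encoding gestures as pose sequences keeps them independent of any particular object's actuation hardware.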

Evaluation

To evaluate our design, we compared our gestures with a baseline 3-DoF version in terms of the distinguishability of social intentions and users’ subjective experience.

Confusion matrices comparing users’ distinguishability of the 6-DoF and 3-DoF gestures.

This is a project I conducted in the Advanced Interactive Technology class during Spring 2021, and it inspired my later projects.

Contributors: Ching-Yi Tsai and Cheng-Hsun Ho.